Combining Prototype Selection with Local Boosting
Authors
Abstract
Real-life classification problems require an investigation of the relationships between features in heterogeneous data sets, where different predictive models may be more appropriate for different regions of the data. One solution to this problem is the local boosting of weak classifiers ensemble method. The main drawbacks of this approach are the time required to predict an unseen instance and the drop in classification accuracy when the local regions contain noise. In this research work, an improved version of local boosting of weak classifiers, which incorporates prototype selection, is presented. Experimental results on several benchmark real-world data sets show that the proposed method significantly outperforms local boosting of weak classifiers, both in predictive accuracy and in the time needed to build a local model and classify a test instance.
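A minimal sketch of the general idea, assuming Wilson-style Edited Nearest Neighbor (ENN) editing as the prototype-selection step and AdaBoost over decision stumps as the local weak-classifier ensemble; the function names and parameter values below are illustrative and are not taken from the paper.

```python
# Sketch only: ENN-style prototype selection followed by per-query local boosting.
# X and y are assumed to be NumPy arrays with integer class labels.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier


def edit_prototypes(X, y, k=3):
    """Wilson-style editing: drop instances misclassified by their k nearest neighbors."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    idx = nn.kneighbors(X, return_distance=False)[:, 1:]  # drop the self-neighbor
    keep = np.array([np.bincount(y[row]).argmax() == y[i] for i, row in enumerate(idx)])
    return X[keep], y[keep]


def predict_locally(X_proto, y_proto, x_test, region_size=50):
    """Boost weak learners on the local region around a single test instance."""
    nn = NearestNeighbors(n_neighbors=min(region_size, len(X_proto))).fit(X_proto)
    idx = nn.kneighbors(x_test.reshape(1, -1), return_distance=False)[0]
    X_local, y_local = X_proto[idx], y_proto[idx]
    if len(np.unique(y_local)) == 1:          # pure region: no boosting needed
        return y_local[0]
    local_model = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=1),  # decision stump (scikit-learn >= 1.2)
        n_estimators=25,
    ).fit(X_local, y_local)
    return local_model.predict(x_test.reshape(1, -1))[0]
```

Editing is done once on the training set, so each prediction afterwards pays only for one neighbor query and one small boosting fit on the retained prototypes, which is where a gain in both noise robustness and prediction time over plain local boosting would be expected.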
Similar Resources
A Boosting-Based Prototype Weighting and Selection Scheme
Prototype Selection (PS), i.e., the search for relevant subsets of instances, is an interesting Data Mining problem. The original studies of Hart and Gates consisted in stepwise producing a Condensed or Reduced set of prototypes, evaluated using the accuracy of a Nearest Neighbor rule. We present in this paper a new approach to PS. It is inspired by a recent classification technique known as Boosting, ...
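For reference, a rough sketch of the Hart-style Condensed Nearest Neighbor baseline mentioned above might look as follows; the boosting-based prototype weighting proposed in that paper is not reproduced here.

```python
# Sketch of Hart's Condensed Nearest Neighbor (CNN): grow a prototype set by
# absorbing only the instances the current set misclassifies under a 1-NN rule.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def condense(X, y, passes=3, seed=0):
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))
    store = [order[0]]                        # start from a single arbitrary prototype
    for _ in range(passes):                   # repeat until (approximately) stable
        changed = False
        nn = KNeighborsClassifier(n_neighbors=1).fit(X[store], y[store])
        for i in order:
            if nn.predict(X[i:i + 1])[0] != y[i]:
                store.append(i)               # absorb every instance the store misclassifies
                nn = KNeighborsClassifier(n_neighbors=1).fit(X[store], y[store])
                changed = True
        if not changed:
            break
    return np.array(store)                    # indices of the retained prototypes
```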
Combining Feature Selection and Ensemble Learning for Software Quality Estimation
High dimensionality is a major problem that affects the quality of training datasets and therefore classification models. Feature selection is frequently used to deal with this problem. The goal of feature selection is to choose the most relevant and important attributes from the raw dataset. Another major challenge to building effective classification models from binary datasets is class imbal...
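A generic sketch of this kind of combination, assuming a univariate filter feeding an ensemble with class weighting for the imbalance; the selector, learner, and parameter values are placeholders rather than the specific methods evaluated in that work.

```python
# Sketch: feature selection + ensemble learning on an imbalanced, high-dimensional set.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=100, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)   # imbalanced, many attributes

model = Pipeline([
    ("select", SelectKBest(score_func=f_classif, k=10)),          # keep the top-k attributes
    ("forest", RandomForestClassifier(n_estimators=100,
                                      class_weight="balanced",    # crude imbalance handling
                                      random_state=0)),
])
print(cross_val_score(model, X, y, scoring="roc_auc", cv=5).mean())
```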
Boosting Soft-Margin SVM with Feature Selection for Pedestrian Detection
We present an example-based algorithm for detecting objects in images by integrating component-based classifiers, which automatically select the best feature for each classifier and are combined according to the AdaBoost algorithm. The system employs a soft-margin SVM for the base learner, which is trained for all features, and the optimal feature is selected at each stage of boosting. We employe...
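The per-stage feature selection can be sketched as a hand-rolled AdaBoost loop in which each round fits one soft-margin SVM per feature on the weighted sample and keeps the feature with the lowest weighted error; binary labels in {-1, +1} and a linear kernel are assumed, and this is illustrative rather than the authors' implementation.

```python
# Sketch: AdaBoost where each weak learner is a single-feature soft-margin SVM.
import numpy as np
from sklearn.svm import SVC


def feature_boost(X, y, rounds=10, C=1.0):
    n, d = X.shape
    w = np.full(n, 1.0 / n)                       # instance weights
    ensemble = []                                 # (alpha, feature index, fitted SVM)
    for _ in range(rounds):
        best = None
        for j in range(d):                        # one weighted SVM per candidate feature
            clf = SVC(kernel="linear", C=C).fit(X[:, [j]], y, sample_weight=w)
            err = w[clf.predict(X[:, [j]]) != y].sum()
            if best is None or err < best[0]:
                best = (err, j, clf)
        err, j, clf = best
        err = min(max(err, 1e-10), 1 - 1e-10)     # numerical safety
        alpha = 0.5 * np.log((1 - err) / err)     # classic AdaBoost weight
        w *= np.exp(-alpha * y * clf.predict(X[:, [j]]))   # emphasize the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, clf))
    return ensemble


def boost_predict(ensemble, X):
    score = sum(alpha * clf.predict(X[:, [j]]) for alpha, j, clf in ensemble)
    return np.sign(score)
```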
Adaptive boosting techniques in heterogeneous and spatial databases
Combining multiple classifiers is an effective technique for improving classification accuracy by reducing the variance through manipulating the training data distributions. In many large-scale data analysis problems involving heterogeneous databases with attribute instability, however, standard boosting methods do not improve local classifiers (e.g. k-nearest neighbors) due to their low sensit...
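One simple way to make a perturbation-insensitive learner such as k-NN respond to boosting weights at all is to resample the training set from the current weight distribution before each round, as in the short sketch below; this only illustrates the problem the abstract raises, not the adaptive scheme the paper proposes, and it assumes the weight vector w is kept normalized.

```python
# Sketch: weight-driven resampling so that a k-NN base learner sees the boosting distribution.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def resampled_knn_round(X, y, w, k=5, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=len(X), replace=True, p=w)  # bootstrap according to w
    return KNeighborsClassifier(n_neighbors=k).fit(X[idx], y[idx])
```

Each such round would then be scored and re-weighted with the usual AdaBoost error and update rule; the abstract's point is that without a step of this kind the successive k-NN members hardly differ.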
Adaptive Boosting for Spatial Functions with Unstable Driving Attributes
Combining multiple global models (e.g. back-propagation based neural networks) is an effective technique for improving classification accuracy by reducing variance through manipulating training data distributions. Standard combining methods do not improve local classifiers (e.g. k-nearest neighbors) due to their low sensitivity to data perturbation. Here, we propose an adaptive attribute boos...
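An attribute-level alternative in the spirit of the idea gestured at above is to give each round a different random subset of attributes so that even k-NN members become diverse. Strictly speaking the sketch below is a random-subspace ensemble rather than the paper's adaptive attribute weighting, and the subset size, majority voting, and integer class labels are assumptions made here.

```python
# Sketch: per-round random attribute subsets to force diversity among k-NN members.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def attribute_ensemble_knn(X, y, rounds=10, subset_size=5, k=5, seed=0):
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(rounds):
        feats = rng.choice(X.shape[1], size=min(subset_size, X.shape[1]), replace=False)
        members.append((feats, KNeighborsClassifier(n_neighbors=k).fit(X[:, feats], y)))
    return members


def vote(members, X):
    preds = np.array([clf.predict(X[:, feats]) for feats, clf in members])
    # unweighted majority vote over the per-round predictions (integer labels assumed)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)
```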